Gauss's Inequality

In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode. Let ''X'' be a unimodal random variable with mode ''m'', and let ''τ''² be the expected value of (''X'' − ''m'')². (''τ''² can also be expressed as (''μ'' − ''m'')² + ''σ''², where ''μ'' and ''σ'' are the mean and standard deviation of ''X''.) Then for any positive value of ''k'',

: \Pr(|X - m| > k) \leq \begin{cases} \left( \dfrac{2\tau}{3k} \right)^2 & \text{if } k \geq \dfrac{2\tau}{\sqrt{3}} \\[6pt] 1 - \dfrac{k}{\tau\sqrt{3}} & \text{if } 0 \leq k \leq \dfrac{2\tau}{\sqrt{3}}. \end{cases}

The theorem was first proved by Carl Friedrich Gauss in 1823.
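The bound is easy to evaluate numerically. The following minimal Python sketch (not part of the original article; the helper name gauss_bound and the standard-normal test case are illustrative choices) computes Gauss's bound and checks it against the empirical tail probability of a unimodal distribution:

```python
import numpy as np

def gauss_bound(k, tau):
    """Gauss's upper bound on Pr(|X - m| > k) for a unimodal X
    with mode m and tau**2 = E[(X - m)**2]."""
    threshold = 2.0 * tau / np.sqrt(3.0)
    if k >= threshold:
        return (2.0 * tau / (3.0 * k)) ** 2
    return 1.0 - k / (tau * np.sqrt(3.0))

# Check against a unimodal example: the standard normal,
# whose mode is m = 0 and whose tau**2 = E[(X - 0)**2] = 1.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
for k in (0.5, 1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(x) > k)
    print(f"k={k}: empirical={empirical:.4f}, Gauss bound={gauss_bound(k, 1.0):.4f}")
```

For the standard normal the bound is loose, as it must hold for every unimodal distribution, but the empirical tail probability never exceeds it.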


Extensions to higher-order moments

Winkler in 1866 extended Gauss's inequality to ''r''th moments (Winkler A. (1866) Math-Natur theorie Kl. Akad. Wiss. Wien, Zweite Abt. 53, 6–41), where ''r'' > 0 and the distribution is unimodal with a mode of zero. This is sometimes called Camp–Meidell's inequality.

: P(|X| \ge k) \le \left( \frac{r}{r+1} \right)^r \frac{\operatorname{E}(|X|^r)}{k^r} \quad \text{if} \quad k^r \ge \frac{r^r}{(r+1)^{r-1}} \operatorname{E}(|X|^r),

: P(|X| \ge k) \le 1 - \left[ \frac{k^r}{(r+1)\operatorname{E}(|X|^r)} \right]^{1/r} \quad \text{if} \quad k^r \le \frac{r^r}{(r+1)^{r-1}} \operatorname{E}(|X|^r).

Setting ''r'' = 2 with E(|''X''|²) = ''τ''² recovers Gauss's inequality; the two branches agree at the crossover point ''k'' = 2''τ''/√3.

Gauss's bound was subsequently sharpened and extended to apply to departures from the mean rather than the mode in the Vysochanskiï–Petunin inequality. The latter has in turn been extended by Dharmadhikari and Joag-Dev:

: P(|X| > k) \le \max\left( \left[ \frac{r}{(r+1)k} \right]^r \operatorname{E}(|X|^r), \; \frac{s}{(s-1)k^r} \operatorname{E}(|X|^r) - \frac{1}{s-1} \right)

where ''s'' is a constant satisfying both ''s'' > ''r'' + 1 and ''s''(''s'' − ''r'' − 1) = ''r''^''r'', and ''r'' > 0. It can be shown that these inequalities are the best possible, and that any further sharpening of the bounds requires additional restrictions to be placed on the distributions.
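As a minimal sketch (again Python, not from the original article; the helper names winkler_bound and dj_bound are illustrative), the two generalized bounds can be implemented directly. The constant ''s'' is computed as the positive root of s² − (r + 1)s − r^r = 0, which is equivalent to the stated condition s(s − r − 1) = r^r together with s > r + 1:

```python
import numpy as np

def winkler_bound(k, r, m_r):
    """Winkler / Camp-Meidell bound on Pr(|X| >= k) for X unimodal
    with mode 0, where m_r = E(|X|**r) and r > 0."""
    threshold = r**r / (r + 1.0)**(r - 1.0) * m_r
    if k**r >= threshold:
        return (r / (r + 1.0))**r * m_r / k**r
    return 1.0 - (k**r / ((r + 1.0) * m_r))**(1.0 / r)

def dj_bound(k, r, m_r):
    """Dharmadhikari-Joag-Dev bound on Pr(|X| > k).  The constant s is
    the positive root of s**2 - (r + 1)*s - r**r = 0, so s > r + 1 and
    s*(s - r - 1) = r**r hold automatically."""
    s = ((r + 1.0) + np.sqrt((r + 1.0)**2 + 4.0 * r**r)) / 2.0
    first = (r / ((r + 1.0) * k))**r * m_r
    second = s / ((s - 1.0) * k**r) * m_r - 1.0 / (s - 1.0)
    return max(first, second)

# Sanity check: with r = 2 and E(|X|**2) = tau**2 = 1, winkler_bound
# reproduces Gauss's inequality, e.g. (2/(3k))**2 in the large-k branch,
# and dj_bound reproduces the Vysochanskii-Petunin form (here s = 4).
for k in (0.5, 1.0, 2.0):
    print(f"k={k}: Winkler={winkler_bound(k, 2, 1.0):.4f}, D-JD={dj_bound(k, 2, 1.0):.4f}")
```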


See also

* Vysochanskiï–Petunin inequality, a similar result for the distance from the mean rather than the mode
* Chebyshev's inequality, which concerns distance from the mean without requiring unimodality
* Concentration inequality – a summary of tail bounds on random variables


References

* Pukelsheim, F. (1994). "The Three Sigma Rule". ''American Statistician'' 48 (2): 88–91. doi:10.2307/2684253. JSTOR 2684253.